Intelligence Analysis: Continuity and Change

Dr. Thomas Fingar, the top analyst in the U.S. intelligence community, discusses post-9/11 intelligence reforms, in particular a process he calls "analytic transformation."  Fingar says the analytic capabilities of his staff are hampered by an excess of new recruits and a shortage of mid-career analysts.  However, new tools, such as the much-touted Intellipedia, have allowed analysts to leverage more information from existing resources.

Fingar says the decision to declassify key judgments from the Iran national intelligence estimate was influenced in part by the poor handling of the prewar estimate on Iraq.  While he would have preferred not to declassify the Iran estimate, Fingar says the politicized atmosphere made it necessary.

JOSEPH J. HELMAN:  As you all gather your food and grab your seat, I'd like to welcome you to today's meeting at the Council on Foreign Relations.  I'm Joe Helman, serving as this year's national intelligence fellow at the council.  This is the third meeting of the series on national intelligence.

Before we begin, I'd like to remind you to please turn off your cell phones and all other electronic devices.

We are not under standard council rules today.  Today's meeting is on the record.  Let me repeat that.  Today's meeting is very much on the record.  Please keep this in mind when we open the meeting up for questions.

We are fortunate to be joined today by Tom Fingar, who since May 2005 has served as the deputy director for National Intelligence for Analysis and chairman of the National Intelligence Council.  You have his biographical information in today's program.

Dr. Fingar has 27 years' experience as an intelligence professional.  In his current position, his responsibilities include enhancing analytic support to intelligence consumers, and he manages the production of National Intelligence Estimates and the President's Daily Brief.  He is a leader and innovator in intelligence reform, which I know he will speak to today.

It is a privilege and a pleasure to welcome Tom Fingar to the Council on Foreign Relations.  (Applause.)

C. THOMAS FINGAR:  Thank you, Joe.  Thank you, Richard, for inviting me to come up here to speak.  Thank you all for giving up a part of your day.

I guess I owe a special thanks to Joe for dropping 10 years off my experience, which I interpret as 10 years off my age.

What I'd like to do today, in 20 minutes or so, is kind of a blitz-like run through some of the things that we are doing under the rubric of analytic transformation, which is one component of the changes in the intelligence community that were made both necessary and possible by the 9/11 commission, the WMD commission report and the IRTPA legislation that created us. 

And I view this as a once in a couple of generation(s) opportunity to tackle problems and attempt changes that very few people have had a chance to undertake, changes that are necessary.

And in a somewhat perverse way, we've got a situation in which people around the top of the intelligence community have worked with one another, known one another, mostly respected and mostly liked one another for several decades.  And we do have a sense of they've -- they've put the inmates in charge of the asylum for a while.  And what are we going to do now that we've been given the opportunity to fix those things that we've just complained about for a long time? 

As a mnemonic device, as I run through this very quickly, with very few examples or illustrations, I'd like to touch upon elements of the job which I would summarize under the headings of task, talent, tradecraft, techniques and tools; with our discussion at the end, if I haven't exceeded my allotted time, (on some of the balance sheets ?), where I think we are in this process.

Let me turn first to task.  To me, it's very simple.  Improve analysis.  The goal is improvement.  It's not transformation, per se.  We're not making changes for the sake of changes.  We're making changes in order to be able to provide better, more insightful, more timely, more targeted analytical support to all of the customers and organizations that we support.  The goal here for analysts is not to make themselves smarter.  It's to make policy better; military decisions more effective, more appropriate; and now -- new to us -- the law enforcement first responder community more able to take advantage of the resources and techniques of the foreign intelligence world and vice versa. 

To improve analysis, we need better analysts.  We have highly motivated, highly professional people.  We also have a youthful workforce, and I'll come back to that.  We need better training, better inculcation of skills, better use of the talent that we've got. 

We need to produce better products.  The critiques of the 9/11 commission, but especially the WMD commission, both focused on a single problem and were motivated by probably the single worst product generated during my career.  The need to produce analytical reports, analytical assessments that address real issues, provide insight, enhance understanding and improve decision-making, I think, is readily apparent, and nobody should argue contrary to that. 

All of this is aimed at providing better support.  The American taxpayer spends a great deal of money on intelligence.  The purpose of all of this is not to make us smarter.  It's to make the performance of our government officials and agencies better.  The task to do this involves better integrating the community.  We've got a lot of organizations that grew up independently, for reasons, in retrospect, that are actually pretty good, pretty sound.

We need better communication, of what it is we're doing, but more importantly better communication, better understanding of what those we support really need.  Annual work plans that grind out products to be delivered sort of as available aren't particularly responsive to a decision-making agenda that is measured in hours and days and is highly variable. 

We need to provide better guidance to the collectors.  The analytic profile and modus operandi, for as long as I've been in the community, has been a Dear Santa exercise:  I want everything; if I can't have everything, here's a list of 100 things that would be nice; if you can only get me 4 or 5 of them, I'll be happy because I didn't have them before.  We're trying to change that paradigm to one in which the analysts are encountering Aladdin and the lamp:  you get three wishes here. 

What are the things that one wants to know?  What are the issues to be understood?  What's going to give you the most insight, the most understanding of a problem?  Focus on and sharpen those questions but you're not done yet as an analyst. 

Before we go to the collectors and say, go get it for me, you say, here's where I think you ought to look.  We ought to know enough about the issues we're investigating to provide guidance on where to look for the answer.  And this has to be dynamic.  Most of all, we have to make better use of our resources, not just the people, what we already know. 

We know, or at least we have amassed, a tremendous amount of information, a very large body of analytic products.  Outside of the intelligence community, there is all of the United States government.  And outside of that, there is all of the United States and all of the sectors and all of the libraries, all of the knowledge.  And then go beyond our borders. 

Knowing what we already know and mining what we already have more effectively before we run great expense and risk in order to get more, and making much better, more efficient use of freely available open source information, bringing the world, the real world, the way everybody else operates, much more closely integrated into the world of intelligence analysis.  In order to do this effectively, we've got to address our people, the talent challenges. 

Here, what we're trying to do is to build a community of analysts where the organizing rubric is, who can help me solve my problem?  Who can help me answer my question?  Who can I help?  Not within a single organization, not based on kind of a narrow definition of competitiveness -- this agency's better than that agency -- but that looks across the entire community and stitches up people willing, able and really eager to work together on problems. 

I mentioned the demography.  Our community has a serious gray-green divide.  Roughly 55 percent of all analysts in the intelligence community joined after 9/11.  To put that in perspective, half of my analysts are working to provide insights to an administration that has been in office longer than they have been on the job.  The bar is pretty high. 

At the other end, there's a relatively small number of baby boomer veterans who measure their future in the community in terms of tuition checks yet to write.  We're very fortunate that many are staying longer than they have to.  They do it out of commitment and professionalism. 

The shape of this is the letter J, with that long leg of the new people and the short leg of the baby boomers.  And in between, that U trough, are the people not hired, the results of downsizing, rightsizing, peace dividends, new opportunities, outsourcing of skills that appeared. 

So that critical generation of people with sort of 15 years' experience that we need to mentor, that know the business, to bring people up the learning curve quickly, is a missing link.  We have relatively few, and they are seriously overstretched. 

So we've got to find ways to deal with this that allow our seasoned veterans to mentor across agencies, that will bring younger analysts up the learning curve quickly enough to begin moving into first line managerial positions in another year or so. 

We also have to take better advantage than we have of the distributed expertise in the community.  If one was truly to sit down, with a blank piece of paper, and design the intelligence community today, it probably wouldn't look the way it does: 16 agencies, actually more analytic components than that. 

But as we looked at this, we had two broad models.  One was a centralizing model:  Take people away; let's consolidate them, shorten the communication to overcome the cultural problems and work together.  We decided that was a bad model in most instances because the way the community has grown up, it's been in response to specific missions. 

So we have agencies, supporting different, often unique customer sets, with very different missions that have developed very different kinds of expertise.  And the task we have is to harness that expertise wherever it is, to take advantage of synergies, to recognize where there are redundancies that are appropriate and where there are redundancies that are unnecessary, making the entire apparatus much more transparent than it has been. 

So we're going to lash up the analysts and the experts.  We needed to know something that nobody knew in the community when the ODNI was established.  Who is doing what?  What did analysts know?  Where did they work?  What did they work on?  Who did they work for?  What did they produce?  Now agency by agency, people had a pretty good idea.  But across the enterprise as a whole, nobody had asked those questions.  There was a very synthetic budget exercise, but it wasn't very helpful to a manager looking at that.

I'll come back to that.

Tradecraft.  If we're going to do all of this better, there's no substitute for sound tradecraft, good analytic methodologies, rigorous adherence to the laws of evidence and inference. 

There are a couple of parts to this.  One is the articulation and enforcement of standards.  We didn't have them before.  The law required me to establish them.  We have.  It was done, as many things have been, through the often painful process of representatives of very different agencies and communities coming together to decide what we needed. 

But the standards for analytic tradecraft have now been issued.  Products across the community are being evaluated against those standards.  We did comparisons, one against another.  We can identify strengths -- relative strengths and weaknesses, common pathologies that need attention.  That can inform training.  It can identify sort of weak managers that need to be perhaps helped a little bit.

We're trying at this point to run this not as a report card where somebody gets an A and somebody else gets a D.  We're trying to do it with a continuous feedback mechanism.  So notionally here, to continue the metaphor, by the end of the semester, everybody gets an A.  It'll be a while before everybody gets an A.

Having said that, my job in transforming the analytical community is actually much easier than that of some of my counterparts in other dimensions working collection, working integration with the law enforcement community and the like, because analysis was never as bad as we were depicted as being.  The Iraq WMD estimate should be thought of as the equivalent of your yearbook photo on the worst bad hair day ever.

Analysts across the community, in every discipline, working every problem, every country got tarred with the same brush of incompetence.  It wasn't true. 

So I'm not trying to pick people up from the subbasement and raise them up higher.  The standard of professionalism and competence was actually pretty good.  So we're making something that was pretty good much better.  That's the goal -- to make it as good as the American public deserves.

A portion of the tradecraft involves transparency.  Show and share the homework.  As in scientific kinds of exercises, we now insist that arguments and evidence be laid out in a way that somebody can not only know exactly what you looked at, what your assumptions were, how you weighted evidence and got to the judgments that you had, so it's reproducible, or it can be challenged because you didn't include a certain body of information or a different assessment -- an assumption can be used to close an information gap. 

Part of this is being done because it's good tradecraft.  Part of it is being done to train our younger workforce.  Part of it is being done to rebuild confidence in the intelligence community, confidence in our capabilities.  We need to lay it out very meticulously and show people, not expect them to trust us. 

And finally, we need to teach people.  What we do is too important to rely on on-the-job training.  The vast majority of it is no different than sound analytical methodologies in any field or discipline, but it's more efficient, more effective to teach people than it is to critique them after they have done it wrong.

Part of the teaching, though, that is different involves facilitating, recognizing and rewarding collaboration.  We've got a lot of smart people, but none of them are smart enough by themselves to adequately address the array of very complex fast-moving issues that we're asked to analyze, that we're -- on which we're asked to provide insight.  It's the classic "Two heads are better than one."  Two perspectives or more perspectives are likely to identify flaws in the argument and so forth.

This is an absolutely innate approach of our younger workforce, sort of the digital generation that reads e-mail, talks on a phone, watches television, has a conversation by multitasking at the same time, you know, linked up to people around the world.  For them, it is the natural way to operate.  For the intelligence community, it is somewhere between heretical and previously deemed impossible, but we have to teach people to collaborate and break down the impediments to doing so.

Let me turn to some of the key techniques that we are using to empower and unleash the potential of the workforce, of the community, as an organization.  Some of these are bottom-up.  We spent much of the first year going around the community, to analysts and agencies.  What impedes you from doing your job?  What do you need to do a better job?  What are the issues that I should take on?  What are the windmills in need of jousting?  What is it that will make the biggest impact?  So we got the bottom-up flow.

The other is top-down; that, as I mentioned earlier, folks at the top of the intel community have known one another for a long time.  We're there in part because, I suppose, we're the last ones standing, but in part it is, we've learned something along the way and have the ability to identify best practices, are open to ideas, suggestions from outside of community, receptive to ideas from respected colleagues who may be in a different part of the bureaucracy, and adapting and clarifying what those best practices are -- again, the transparency, making them known, making it easier for people to choose from a menu of things that work, rather than prescribing a single formula for addressing problems.

The law that created us mandated joint duties.  Modeled on Goldwater-Nichols for the military, it's kind of a rough analogy, but the idea here is that people will be better able to collaborate if they actually walk in somebody else's shoes, experience what it's like to work in that organization, establish a network of friends and colleagues in a different part of the organization that they can activate, utilize day in and day out to do the job better. 

What we're trying to do to build the expertise, to build the sense of community to satisfy the joint duty requirements, is to move towards a reasonably organized set of options that will be career tracks:  each move that sort of utilizes and builds expertise and then moves on to a higher level of responsibility requiring a higher level of knowledge and a more effective network of contacts.

I decided to take advantage of the multiple elements in my job jar to establish some demonstration test beds -- ways to try out what we're doing and make people follow them -- whereas with most of the community, most of the approach is sort of voluntary:  bring people along, build consensus behind it, which I believe is a necessary way.

But I directly -- I am directly responsible for two reasonably important sectors of activities.  One is the president's daily brief, and the other is the National Intelligence Council, which produces estimates and other products.  In these, I could simply -- since I'm in charge -- specify what the new rules and the procedures would be.  The goal was not to feel like the emperor.  It was to have fora in which there was cross-community participation both for the PDB and for the NIC products.  The standards have always been very high.  Bring people in, have them experience it, go back to their agencies having experienced, rather than simply heard about, other ways of doing it.  I'll come back to this, but I think we are readily acknowledged to have had a very big impact on both of these, the PDB and the NIC products.

Make it possible to discover what we already know.  It's been awfully serendipitous.  People had their shoebox full of notes, people had their Rolodex full of the people that they know, people have it in their head -- I remember seeing a really good piece -- I can't remember who wrote it; it was about six months ago.

We're doing a number of things to make this easier.  One is called the Library of National Intelligence.  CIA is the executive agent.  It's now been up and ingesting documents for three months.  It's now ingesting about 2,500 to 3,000 per day.  We'll ingest all disseminated products -- collection products, analytic products -- and once they're in there, you can do all kinds of things.  The Amazon approach -- "the last person that asked for this report also found these useful."  "This report was used in the following analytic products."  That is the ability to see if we already have the answers to questions; to look across products done in different agencies at different times and, you know, compare them on locational data, time sequence data, and see what has changed.
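To make the "also found these useful" idea concrete, here is a minimal, purely illustrative sketch of co-access ranking over a document library's reading log; the report IDs, analyst labels, and the also_found_useful helper are invented for the example and are not part of the actual Library of National Intelligence.

```python
from collections import defaultdict

# Purely illustrative toy access log: analyst -> set of report IDs pulled
# from the library.  All identifiers are invented for this sketch.
access_log = {
    "analyst_a": {"rpt-001", "rpt-002", "rpt-007"},
    "analyst_b": {"rpt-001", "rpt-003"},
    "analyst_c": {"rpt-001", "rpt-002", "rpt-005"},
}

def also_found_useful(report_id, log):
    """Rank other reports by how often they co-occur with report_id in the
    same reader's history ("the last person that asked for this report
    also found these useful")."""
    counts = defaultdict(int)
    for reports in log.values():
        if report_id in reports:
            for other in reports - {report_id}:
                counts[other] += 1
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

# rpt-002 ranks first (read alongside rpt-001 by two analysts);
# the single-count reports follow in arbitrary order.
print(also_found_useful("rpt-001", access_log))
```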

We've used a competitive process.  When we took over, there were 11 proposals nominated.  In the last year, it was 113.  One of the requirements is people have to be from more than one agency.  The proposal has to utilize more than one type of intelligence -- signals intelligence, geospatial intelligence, human intelligence.  And the success rate in this is better than 50 percent.  And almost all of them had been problems that somebody had determined were unsolvable -- locating underground facilities, tracking foreign leaders -- really remarkable, utilizing mathematical algorithms, running them against historical data, mining what we already know, outreach, networks of experts.

I spent a long time in INR, which is a little organization.  But the State Department's not little, and the intelligence community's not little.  And knowing who was working on your problem, whether they were in your bureau, in the department, in the intelligence community -- and because of the State Department's conferencing and seminar program -- the last year I was assistant secretary, we did 100 -- I'm sorry, 280 seminars -- (inaudible) -- we'd reach out to the experts and bring them in, making that a community-wide approach.

Alternative analysis.  Having taught graduate students for many years, you drive them nuts by taking their set of facts, rearranging them and coming out with an alternative explanation.  It's sort of an automatic thing to do.  We need to be doing it.  Every analyst needs to be doing it.  Multiple hypotheses, rather than being lawyers that have a bottom line and marshal the evidence to prove the case.

Alternative analysis is a deliberate choice, not competitive analysis, which is what is written into some of the old executive orders and statutes.  They were intended to mean the same thing -- different set of eyes, fresh view -- but over time, competitive analysis in the intelligence community turned out to be an early 20th century Pulitzer versus Hearst, scooping one another -- kind of a rather silly exercise at becoming the first one to misinform policymakers.  We want people to get it right, not be there first.

A couple words on tools, and then I will bring this to a close and take your questions.

Tools is a bad word, a dirty word in the analytic community.  There have been so many false promises.  There have been so many versions of a -- here's a new cell phone that does 46 more things.  You probably have no use for 38 of them, but we want you to have them.  We're trying to get tools that people actually want and are developing them out of the analytic side.  And the use of them is not a diversion; it's a way to do work, a way to do business.

Let me just mention a few.  We didn't know who did what in the intelligence community.  Probably 10 years ago, my good friend John Gannon came up with the idea of a database -- an analytic resource catalog -- who knows what language, served in what countries, what kind of experience and so forth, so that you could find people.  John couldn't get it started.  Sort of the -- there were a lot of reasons why this was a bad idea.  I thought it was a good idea.  Mark Lowenthal actually got it started, but nobody would put data into it because it was billed as a "This is where we'll go if we need to build a task force.  We needed a Serbo-Croat speaker to send out to East Armpit."  And people didn't put that data in.  Others -- managers saw it as a free-agent list.  If I advertise what talent I've got, somebody -- probably Fingar in INR -- will try and steal it.

I approached -- as I said, this is a mixed database of expertise.  If you're not in it, it means one of two things:  You don't know anything, or your boss thinks you don't know anything.  If you don't know anything, you're not writing for the president, you're not briefing on the Hill, you're not participating in National Intelligence Council activities.

I said this to a group of community seniors about as large as this group.  Sitting next to me was Pat Kennedy, who was the deputy for Management at the time.  Pat says, "One more thing.  If you're not in Fingar's database, you're not in a funded position."

We're reconciling the budget documents that go to the Hill through this database.  I suddenly discovered I had 1,200 more analysts than I knew I had, even by estimating.  But we can now produce phone books, e-mail directories.  If you need to find an expert on economics in the Andean region, you can find out where they are, how to contact them, and people are using it.
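As a hedged illustration of what querying such an expertise catalog could look like, here is a small sketch of a filterable directory; the record fields, names, agencies, and the find_experts helper are invented for the example and do not describe the actual analytic resource catalog.

```python
from dataclasses import dataclass, field

# Purely illustrative stand-in for an expertise catalog; every field,
# name, and agency below is invented for this sketch.
@dataclass
class AnalystRecord:
    name: str
    agency: str
    email: str
    languages: set = field(default_factory=set)
    regions: set = field(default_factory=set)
    topics: set = field(default_factory=set)

catalog = [
    AnalystRecord("J. Doe", "INR", "jdoe@example.gov",
                  {"Spanish"}, {"Andean region"}, {"economics"}),
    AnalystRecord("A. Smith", "DIA", "asmith@example.gov",
                  {"Serbo-Croatian"}, {"Balkans"}, {"military forces"}),
]

def find_experts(catalog, topic=None, region=None, language=None):
    """Return records that match every criterion actually supplied."""
    hits = []
    for rec in catalog:
        if topic and topic not in rec.topics:
            continue
        if region and region not in rec.regions:
            continue
        if language and language not in rec.languages:
            continue
        hits.append(rec)
    return hits

# "Find an expert on economics in the Andean region."
for rec in find_experts(catalog, topic="economics", region="Andean region"):
    print(rec.name, rec.agency, rec.email)
```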

I mentioned the library.  Intellipedia.  It's been written up.  It's the Wikipedia on a classified network, with one very important difference:  it's not anonymous.  We want people to establish a reputation.  If you're really good, we want people to know you're good.  If you're making contributions, we want that known.  If you're an idiot, we want that known too. 

This started out as sort of we built it.  We had some assumptions about the digital generation.  Would they come?  We've now got over 35,000 registered users.  I have about 18,000 analysts.  So there are people in this well beyond the analysts. 

It has become the vehicle of choice for some kinds of collaboration.  You can stand up portals in a matter of hours to deal with crises.  It took two months less on Intellipedia for this closed community to reach a million edits than it took Wikipedia on the Internet to hit a million edits.  Now that's because of skills learned outside on the Internet being brought in.  It's become a very, very effective way to do things. 

The final tool I'll mention is A-Space, for Analyst Space.  It's been compared to MySpace, but that's not right.  The idea is a digital workspace where people around the community, located around the U.S., located overseas, Baghdad, Kabul, can share information, share work in progress, tee up questions, draw upon databases.

This is also not just an idea; it's up in prototype version, ingesting data.  We've got 250 sort of early adopters trying to beat the crap out of it and tell us what they really need.

In the first iteration of this, we got 170 complaints and suggestions.  We adopted over a hundred of them.  And everybody that complained was put on the panel to design A-Space.  Nobody objected to that.  They took this seriously, as a serious effort to build something that incorporated the requirements that would help them to do the job.

Let me close now with sort of where I think we are.  I think it's an obtainable objective within the analytic transformation (range ?) by the end of this year to have the changes sufficiently locked in, sufficiently clear, institutionalized; that there will be very little interest in dismantling them.  They will have proven their utility.  And to hedge it, they'll be sufficiently cemented in that even if people are inclined to change them, they'll have to blast them out.

The training, the evaluation, the judgments of the improvement in the quality of the support we are providing sort of is a daily reinforcement for what we are doing.

Do we have it completely right?  Of course not.  This is going to take probably another several years to get everything really smoothly in place. But I think we're on the right track, and after now 37 years of being in and around the community, I'm as excited about what we're doing now as anything in my career.

Let me stop and invite questions.  And Joe, I guess you moderate.

HELMAN:  Thanks.  First of all, thank you very much for the view of the landscape of intelligence analysis reform.  I suspect when we open the meeting for questions, there will be at least a couple of questions on NIEs.  So let me start there.

The process -- the decision and the process to declassify and release the key judgments -- how did that come about?  And how do you see that taking place with future NIEs, if at all?

FINGAR:  I don't like to release unclassified key judgments.  One of the aspects of the highly politicized environment in Washington was, as we began to build -- rebuild confidence in sort of the analytic community, intelligence community, the Congress began to write into law requirements to produce unclassified versions of key judgments.  We pushed back on all of these successfully -- on most of them, not all, on some with a waiver.  Perceived as sort of an honest observer -- I like being perceived as honest, but I'd rather not produce things that are unclassified. 

The first one that we did was on Iraq -- and there was a great deal of congressional pressure to do this; frankly, a great amount of concern that it would leak or be leaked, and that we would have no control of what was out there.  And it was -- this was a year ago January -- a very high period of interest.  We decided to do the unclassified version of the key judgments. 

Six months later, when it was time for an update, same kind of pervasive attitude.  We decided to do it again.

We decided on the Iran estimate that we would not do unclassified key judgments.  The nature of the sourcing makes it a -- it was a hard topic to explain.  So we weren't going to do this. 

In October of last year DNI McConnell issued guidelines for declassification.  It is the policy of the DNI not to produce unclassified key judgments.  That's the starting point. 

We live in Washington.  There are policies, and there are needs to accommodate.

What flowed below that were steps and requirements, and I'd boil them down to if you're going to declassify, you have to do it all, as it was in the original, subject to what you have to take out to protect sources and methods -- no spinning, no cherry-picking, no perception of cherry-picking on this. 

And history matters here, because part of what made it possible for me to be in this job was the Iraq WMD estimate.  Part of the Iraq WMD history is that the estimate for a very long time remained classified, and most of it still is classified.

The CIA issued a white paper which was in some ways presented, and certainly was interpreted, as an unclassified version of the estimate.  That white paper did not acknowledge the dissenting views that were in the estimate, particularly on the nuclear capabilities.  And because that was the example of spinning, and we can't do that, it would put us into an almost Procrustean bed of reproducing what we had in the classified version.  So to bring it to a close, there are rules and procedures that are standards, but we hope not to use them.

HELMAN:  Okay.

Before we open the meeting to questions, let me just remind everyone, the session is on the record.  We have microphones.  Please raise your hand.  When you're called upon, stand, wait for the microphone, identify yourself and your affiliation.  Please keep your questions brief.

Chris.

QUESTIONER:  Thank you.  Just -- my name is Chris Isham -- sorry -- with CBS.

I wonder if you would elaborate a bit more on the NIE relating to Iran.  It obviously came as a big shock to many people, a lot of our allies, particularly the way it was written.  Do you have any regrets about that?  Do you have any -- do you think it was handled perfectly well?  Are you still comfortable with the conclusions of that particular NIE?

FINGAR:  I'm certainly comfortable with the judgments -- the conclusions that are in it, and nobody who has looked at it has challenged those judgments.

If we had thought that this was going to be released, we would have written the key judgments differently than we did.  But the -- the explicit discussion of this was that this was not going to be declassified.  As it was organized, part of the message -- this was going to go to people who follow the issue -- was to identify what's different from the last time we did this.  And the piece that is different was the decision to halt the weaponization portion of the program in response to unspecified international pressure and scrutiny.

We thought that was important to put up front for the busy people, because the rest -- timelines, that they had a secret nuclear program that they've lied about, that they continue to enrich with centrifuges to build the fissile material necessary for a weapon, continue to develop the missile delivery capabilities, timelines for weaponization -- a great many, in fact most, of the judgments did not change.  We had much higher levels of confidence in them than we had in earlier judgments because of new information.

But the -- what's different was the shift from a judgment that Iran was absolutely determined to have a nuclear weapon -- the implication was you can't do anything to prevent that -- to a judgment that says they've halted in response to international pressure, suggesting there's a kind of cost-benefit analysis at work; that they've kept open, at a minimum, the option of having a weapon, but maybe there's scope here to work the problem.  We thought that was the most important new finding.

Now, in the way in which it was released and the way in which, in my view, it was spun -- depicted deliberately by some folks to the media and by some media following along -- it took it in an entirely different direction than what we thought was the principal message.

QUESTIONER:  Jeffrey Toobin from CNN and The New Yorker.

I was wondering what -- how much of your reforms relate to the policy environment of this administration?  Or to put it another way, what difference does it make who the president is?

FINGAR:  In terms of what we do, almost none of it, with the exception of the PDB, should be dependent on who the president is.  The PDB is tailored, and it reflects, in this case, a president who spends a great deal of time on intelligence.

Among the things that we are doing now, one of the changes is bringing analysts in two or three times a week to participate in the dialogue with principals, what are called "deep dives," to really bore in on issues.  Whether future presidents will allocate as much time to this or want to do it in that way, we will adjust to the wishes of the successor.  But the objectivity, the desire for clarity, the clear expression of confidence levels, the distinguishing between sort of clinical evidence and the judgments that go into this, should not change with a change of administration.

HELMAN:  Sir.

QUESTIONER:  Frank McClane (ph) with -- (off mike).

A psychological factor which we ought to keep in mind, the factor of fear -- if the findings are so horrible in some situations, the decision-maker may push it outside.  The classical historic example is Barbarossa, starting -- (inaudible) -- from the (Roosevelt ?) administration.  He was afraid if he responded to it, he might even provoke the Nazi attack on the Soviet Union.  There may be additional examples when we get more data declassified about planning for Iraq.  My hunch is we'll find some such thing.

One example I could think of -- it's a bit far-fetched, but it's suitable for an on-the-record session -- is the electromagnetic pulse, the havoc that could be caused by one particular nuclear weapon detonated in the atmosphere that would knock out electricity, possibly from coast to coast.  There's an unclassified report by the commission.  There are Russian and Chinese reports on the same subject, but presently we're not doing much about it.  It is awful, and we need money for other things.  It's just a suggestion that we need to keep that in mind, Barbarossa effect, that the -- doesn't lose all priorities, maybe slightly far-fetched warnings which are ugly and people push aside.

FINGAR:  Let me actually answer a question that you didn't ask, but that grows out of it, having to do with the responsibility of policymakers to make decisions.  As I try to instill in young analysts, it's not our responsibility to tell them what to decide; it's our responsibility to give them the knowledge that would help them to make a decision.

At the end of the day, there may be a logic to our analysis that one could conclude would lead to a particular decision or a conscious decision not to decide yet.  That's not our choice.  And if the policymakers and military commanders make a decision that is influenced by factors in addition to what we present, live with it, accept it, because attempting to lead policymakers to a particular choice is wrong.  It is not being objective and a violation of the ethos of the profession.

Trying to provide warning that isn't "Chicken Little," that is alerting -- and it works best when you sort of ramp up and educate people so that they think they discovered it themselves, rather than you came in at the 11th hour ringing a gong -- is part of the art.

QUESTIONER:  Let me ask you to speak to some of the changes that you've instituted with respect to the production and review of NIEs.  The external review, the greater community coordination, could you speak to some of those, please?

FINGAR:  Yeah.  Possibly the most important change with respect to NIEs was not made by us.  It was made by George Tenet after the Iraq WMD estimate, and that was to insist on greater clarity about sourcing information.  It started with estimates:  drafters of estimates needed to be given much more information about the sourcing.  George expanded it, and we have expanded it much more broadly. 

If you've got seven reports, it makes a big difference whether they're seven reports from one individual or from seven individuals, whether there is somebody -- (inaudible) -- or it's a liaison of which you don't know much.  And as analysts sort of got that kind of understanding, they asked different kinds of questions about it.  They could go back.  So that's one change. 

And the scrubbing of sources is done now as a part of an estimate:  each of the collection agencies has to submit a written report addressing each of the items that they produced that is used here.  Do they still stand by it?  Do they have any doubts about it?  Have any questions been raised about the source?  And that's a part of the production. 

Second, we've abandoned the drive for consensus, which has clearly a lowest common denominator element to it.  It's not very meaningful if we all agree that China's a big country with a large population and a rapidly growing economy. 

We've tried to identify early and explicitly where there are analytical disagreements.  Not a democratic exercise -- let's see how many agencies think one thing, deciding the credibility of an analytic judgment on the basis of how many agencies vote for it -- but what's the power of the argument? 

And rather than saying somebody dissents from this majority view, if we've got multiple views -- usually it's about two; usually it is two -- each of them is laid out, the reasons for the disagreement are made clear, and we make that knowledge available very prominently in the estimative process.  We do the same in the PDB. 

We're trying to convey to those who rely on us for support that smart people, looking at the same information, are coming to different judgments.  That's more important much of the time than who's right, who's wrong here -- that we're working with imperfect information.  And an understanding of the ice being pretty thin under some of these judgments is critical.  So we built that in. 

Other elements are to take out more of the, what I would call, gratuitous references to, quotations of, intelligence, of source reporting -- kind of a substitute for tough-minded analysis, with a quote as if that somehow made the case -- which also jeopardizes the sourcing.  It's the analytical judgments stitched to a quote that get used to whack an opponent or that are kind of sexy to leak.  People seldom consciously want to leak something about the source or the method, so we take that out, put it in an end note, and don't disseminate the end note in order to protect the often fragile access to the information. 

HELMAN:  Tony. 

QUESTIONER:  I'm Tony Holmes, the Cyrus Vance fellow here. 

I'd like to get you to change focus a little bit from slowly evolving strategic issues, like nuclear weapons in Iran, to shorter term issues, and have you say something about the interplay between the collectors, the analysts and the operators. 

Now, I know the United States government never tortures.  But I've heard many times that people, who think they're being tortured, often say things that aren't very accurate.  And I wonder that the things that the people eliciting this information think might be actionable in the very near term, within hours or a couple of days, how much of that is never actually analyzed?  How much of that goes directly to the operations folks to plan on the basis of, and to launch a Predator missile or a B-52 or whatever it is that's leading to this proliferation of reports about all these innocent people being killed in farmhouses, in villages, in other -- in vehicles around the world? 

HELMAN:  Thank you. 

FINGAR:  The short answer, leaving aside the contextual points you make, is double tracking.  The information that has operational utility, to give a squad on the ground in Baghdad which door they ought to go in, where they ought to avoid, that kind of stuff got delivered to them as quickly as possible.  That same information, or information whatever the genesis of it, goes into the larger pot that is then available for subsequent analysis. 

Some of this gets done within the collection room as part of the vetting process.  How has that particular source done?  Does it check out or not check out?  So people get a reputation and a record.  Is it as flawless/seamless as it should be or as it could be?  Not yet.  And an awful lot of it has to do with the pace of operations on the ground.

We continue back in Washington to complain that we're not getting enough of the tactical reporting that is collected in Iraq, for example, that would give us insights into stability/instability in neighborhoods, organizational networks and the like.  This is used and consumed in the field, and there's a sense of, "I don't have time to report this."  And it's been a constant tug-of-war trying to capture more of it, but it has the effect of sort of turning the battlespace into a clerical operation, at least in the minds of some.

HELMAN:  Mr. Tempelsman.

QUESTIONER:  Maurice Tempelsman.

How do these changes relate to foreign intelligence services?  And does it have an implication that you've changed your approach to that?  And are there other changes that the foreign intelligence services are making that are coordinated with this or consistent with this?

FINGAR:  Let me pick out three elements.  There are actually more.

But one has to do with the pretty high degree of interest in what we are doing on the part of other services.  It's been quite public.  They know the catharsis that we went through.  Many realize that they are not well structured for the contemporary world and the world to come; that they have to make changes.  And they're very interested in sort of what we're doing, why we're doing it, is it scalable from us to them and the like.

One of the things that I've observed that actually gives them a lot of heartburn is our collaboration, sharing.  For all of our problems, all of our stovepiping and cultural barriers, we're light years ahead of most of our partners in terms of their willingness to share across their own system.

A second has to do with the kind of excitement about the kind of analytic transformation that I've described, harnessing outside expertise, working with virtually -- dealing with the generation, the digital generation, generation X, that we'd like to learn from and do this.  And some of the close partners want to make sure that as we do this we don't do it in ways that will prevent them from engaging with us.  Let's keep the door open for this.

The third is fear.  When we talk about -- it's a different kind of fear, but it's no less strong here.  When we talk about sharing our information sort of within the larger community, there is a, "Oh, my information is very sensitive, and I don't think I want it seen by all those many people."  And the scale of our community is intimidating to some.  So we get the "We want to be able to have access and take advantage of what you're doing, but we're kind of nervous about our stuff being in it."

I have kind of a flip answer, which is simple.  If you don't trust my analysts to use your material, you shouldn't expect them to be providing any analytic judgment to share with you.  It has -- it resonates.

The final one is in terms of alternative analysis.  One of the things about our partner services, particularly the best ones of them -- getting the perspective of the Spanish on the Maghreb, of the Mexicans on pieces of Latin America, of the Australians on the part of the world that they live in -- is that, going in with the same terms of reference, the same set of questions, they've got different sorts of information.  They look at it through their policy lenses and mission requirements.  Much of the time we find we're in agreement and it corroborates.  When you respect somebody else and they come up with a different answer, we say, hello, we'd better dig into this a little more.

HELMAN:  We have time for one more, but before we get to that, I'd like to thank you for coming and offer you the opportunity for any last comments you might have.

FINGAR:  No, just that I'm delighted.  I wish there were more time, because I'm enjoying this, but let's go to one more question.

HELMAN:  Okay, we had one more.  Sir.

QUESTIONER:  Scott Malcolmson from the New York Times.  Last July you talked at some length about a kind of ambitious program for using open sources, and it wasn't really part of your talk today.  I was just curious what happened with those ambitions.

FINGAR:  The ambitions are there.  My references to ordinary information, open source -- I think I hit it in passing.  I'll make up a number, but the magnitude is right.  Probably 75 or 80 percent of the issues we work, you can get 85 or 90 percent of the way there with open source information.  That doesn't help you to penetrate a terrorist group, but it tells you an awful lot about European politics.  It tells you an awful lot about economic issues. 

Not only is that information available, it's often pretty well organized.  It's accessible.  It's identified as to who produced it so you can go ask questions about it and you don't have to wonder who this unknown source of questionable reliability is.  And it makes it possible to reach out to people who don't have clearances but know a great deal and can tell you what they saw when they were on their latest trip to Chile.  And they go back every six months for business, and they can tell you what changed.

This is terribly important to us.  We have in final stages a draft intelligence community directive, the rules we live by, that mandates that agencies enable analysts to reach out beyond, to simplify the procedures for doing this, reward people who do it.  It ought to be a normal part of what we do, not being fixated on secrets dribbling into the computer's in-box.

HELMAN:  Well, thank you very much again for joining us today.   Thank you all -- (applause).

FINGAR:  Thank you all. 


JOSEPH J. HELMAN:  As you all gather your food and grab your seat, I'd like to welcome you to today's meeting at the Council on Foreign Relations.  I'm Joe Helman, serving as this year's national intelligence fellow at the council.  This is the third meeting of the series on national intelligence.

Before we begin, I'd like to remind you to please turn off your cell phones and all other electronic devices.

We are not under standard council rules today.  Today's meeting is on the record.  Let me repeat that.  Today's meeting is very much on the record.  Please keep this in mind when we open the meeting up for questions.

We are fortunate to be joined today by Tom Fingar, who since May 2005 has served as the deputy director for National Intelligence for Analysis and chairman of the National Intelligence Council.  You have his biographical information in today's program.

Dr. Fingar has 27 years' experience as an intelligence professional.  In his current position, his responsibilities include enhancing analytic support to intelligence consumers, and he manages the production of National Intelligence Estimates and the President's Daily Brief.  He is a leader and innovator in intelligence reform, which I know he will speak to today.

It is a privilege and a pleasure to welcome Tom Fingar to the Council on Foreign Relations.  (Applause.)

C. THOMAS FINGAR:  Thank you, Joe.  Thank you, Richard, for inviting me to come up here to speak.  Thank you all for giving up a part of your day.

I guess I owe a special thanks to Joe for dropping 10 years off my experience, which I interpret as 10 years off my age.

What I'd like to do today, in 20 minutes or so, is kind of a blitz-like run through some of the things that we are doing under the rubric of analytic transformation, which is one component of the changes in the intelligence community that were made both necessary and possible by the 9/11 commission, the WMD commission report and the IRTPA legislation that created us. 

And I view this as a once in a couple of generation(s) opportunity to tackle problems and attempt changes that very few people have had a chance to undertake, changes that are necessary.

And in a somewhat perverse way, we've got a situation in which people around the top of the intelligence community have worked with one another, known one another, mostly respected and mostly liked one another for several decades.  And we do have a sense of they've -- they've put the inmates in charge of the asylum for a while.  And what are we going to do now that we've been given the opportunity to fix those things that we've just complained about for a long time? 

As a mnemonic device, as I run through this very quickly, with very few examples or illustrations, I'd like to touch upon elements of the job which I would summarize under the headings of task, talent, tradecraft, techniques and tools; with our discussion at the end, if I haven't exceeded my allotted time, (on some of the balance sheets ?), where I think we are in this process.

Let me turn first to task.  To me, it's very simple.  Improve analysis.  The goal is improvement.  It's not transformation, per se.  We're not making changes for the sake of changes.  We're making changes in order to be able to provide better, more insightful, more timely, more targeted analytical support to all of the customers and organizations that we support.  The goal here for analysts is not to make themselves smarter.  It's to make policy better; military decisions more effective, more appropriate; and now -- new to us -- the law enforcement first responder community more able to take advantage of the resources and techniques of the foreign intelligence world and vice versa. 

To improve analysis, we need better analysts.  We have highly motivated, highly professional people.  We also have a youthful workforce, and I'll come back to that.  We need better training, better inculcation of skills, better use of the talent that we've got. 

We need to produce better products.  The critiques of the 9/11 but especially the WMD commission both focused on a single problem and motivated by probably the single worst product generated during my career.  The need to produce analytical reports, analytical assessments that address real issues, provide insight, enhance understanding and improve decision-making, I think, is readily apparent, and nobody should argue contrary to that. 

All of this is aimed at providing better support.  The American taxpayer spends a great deal of money on intelligence.  The purpose of all of this is not to make us smarter.  It's to make the performance of our government officials and agencies better.  The task to do this involves better integrating the community.  We've got a lot of organizations that grew up independently, for reasons, in retrospect, that are actually pretty good, pretty sound.

We need better communication, of what it is we're doing, but more importantly better communication, better understanding of what those we support really need.  Annual work plans that grind out products to be delivered sort of as available isn't particularly responsive to a decision-making agenda that is measured in hours and days and is highly variable. 

We need to provide better guidance to the collectors.  The analytic profile and modus operandi, for as long as I've been in the community, has been a Dear Santa exercise:  I want everything; if I can't have everything, here's a list of 100 things that would be nice; if you can only get me 4 or 5 of them, I'll be happy because I didn't have them before.  We're trying to change that paradigm from one in which the analysts have to be encountering Aladdin and the lamp; you get three wishes here. 

What are the things that one wants to know?  What are the issues to be understood?  What's going to give you the most insight, the most understanding of a problem?  Focus on and sharpen those questions but you're not done yet as an analyst. 

Before we go to the collectors and say, go get it for me, you say, here's where I think you ought to look.  We ought to know enough about the issues we're investigating to provide guidance on where to look for the answer.  And this has to be dynamic.  Most of all, we have to make better use of our resources, not just the people, what we already know. 

We know, or at least we have amassed, a tremendous amount of information, a very large body of analytic products.  Outside of the intelligence community, there is all of the United States government.  And outside of that, there is all of the United States and all of the sectors and all of the libraries, all of the knowledge.  And then go beyond our borders. 

Knowing what we already know and mining what we already have more effectively before run great expense and risk in order to get more, and making much better, more efficient use of freely available open source information, bringing the world, the real world, the way everybody else operates, much more closely integrated into the world of intelligence analysis.  In order to do this effectively, we've got to address our people, the talent challenges. 

Here, what we're trying to do is to build a community of analysts where the organizing rubric is, who can help me solve my problem?  Who can help me answer my question?  Who can I help?  Not within a single organization, not based on kind of a narrow definition of competitiveness -- this agency's better than that agency -- but that looks across the entire community and stitches up people willing, able and really eager to work together on problems. 

I mentioned the demography.  Our community has a serious gray-green divide.  Roughly 55 percent of all analysts in the intelligence community joined after 9/11.  To put that in perspective, half of my analysts are working to provide insights to an administration that has been in office longer than they have been on the job.  The bar is pretty high. 

At the other end, there's a relatively small number of baby boomer veterans who measure their future in the community in terms of tuition checks yet to write.  We're very fortunate that many are staying longer than they have to.  They do it out of commitment and professionalism. 

The shape of this is the letter J, with that long leg of the new people and the short leg of the baby boomers.  And in between, that U trough, are the people not hired, the results of downsizing, rightsizing, peace dividends, new opportunities, outsourcing of skills that appeared. 

So that critical generation of people with sort of 15 years experience that we need, to mentor, that know the business to bring people up the learning curve quickly, is a missing link.  We have relatively few, and they are seriously overstretched. 

So we've got to find ways to deal with this that allow our seasoned veterans to mentor across agencies, that will bring younger analysts up the learning curve quickly enough to begin moving into first line managerial positions in another year or so. 

We also have to take better advantage than we have of the distributed expertise in the community.  If one were truly to sit down with a blank piece of paper and design the intelligence community today, it probably wouldn't look the way it does:  16 agencies, and actually more analytic components than that. 

As we looked at this, we had two broad models.  One was a centralizing model:  take people away from their agencies, consolidate them, shorten the lines of communication to overcome the cultural problems, and have them work together.  We decided that was a bad model in most instances, because the community has grown up in response to specific missions. 

So we have agencies, supporting different, often unique customer sets, with very different missions that have developed very different kinds of expertise.  And the task we have is to harness that expertise wherever it is, to take advantage of synergies, to recognize where there are redundancies that are appropriate and where there are redundancies that are unnecessary, making the entire apparatus much more transparent than it has been. 

So we're going to lash up the analysts and the experts.  We needed to know something that nobody knew in the community when the ODNI was established.  Who is doing what?  What did analysts know?  Where did they work?  What did they work on?  Who did they work for?  What did they produce?  Agency by agency, people had a pretty good idea.  But across the enterprise as a whole, nobody had asked those questions.  There was a very synthetic budget exercise, but it wasn't very helpful to a manager looking at it.

I'll come back to that.

Tradecraft.  If we're going to do all of this better, there's no substitute for sound tradecraft, good analytic methodologies, rigorous adherence to the laws of evidence and inference. 

There are a couple of parts to this.  One is the articulation and enforcement of standards.  We didn't have them before.  The law required me to establish them.  We have.  It was done, as many things have been, through the often painful process of representatives of very different agencies and communities coming together to decide what we needed. 

But the standards for analytic tradecraft have now been issued.  Products across the community are being evaluated against those standards.  We do comparisons, one agency against another.  We can identify relative strengths and weaknesses, and common pathologies that need attention.  That can inform training.  It can identify weak managers who perhaps need to be helped a little bit.

To this point, we're trying to run this not as a report card where somebody gets an A and somebody else gets a D, but as a continuous feedback mechanism.  So notionally, to continue the metaphor, by the end of the semester everybody gets an A.  It'll be a while before everybody gets an A.

Having said that, my job in transforming the analytic community is actually much easier than that of some of my counterparts working other dimensions, collection, integration with the law enforcement community and the like, because analysis was never as bad as we were depicted as being.  The Iraq WMD estimate should be thought of as the equivalent of your yearbook photo taken on the worst bad-hair day ever.

Analysts across the community, in every discipline, working every problem and every country, got tarred with the same brush of incompetence.  It wasn't true. 

So I'm not trying to pick people up from the subbasement and raise them up higher.  The standard of professionalism and competence was actually pretty good.  So we're making something that was pretty good much better.  That's the goal -- to make it as good as the American public deserves.

A portion of the tradecraft involves transparency.  Show and share the homework.  As in scientific exercises, we now insist that arguments and evidence be laid out in a way that somebody can not only know exactly what you looked at, what your assumptions were, and how you weighted evidence and got to the judgments you reached, so that it is reproducible, but can also challenge it because you didn't include a certain body of information, or because a different assumption could be used to close an information gap. 

Part of this is being done because it's good tradecraft.  Part of it is being done to train our younger workforce.  Part of it is being done to rebuild confidence in the intelligence community, confidence in our capabilities.  We need to lay it out very meticulously and show people, not expect them to trust us. 

And finally, we need to teach people.  What we do is too important to rely on on-the-job training.  The vast majority of it is no different than sound analytical methodologies in any field or discipline, but it's more efficient, more effective to teach people than it is to critique them after they have done it wrong.

Part of the teaching, though, that is different involves facilitating, recognizing and rewarding collaboration.  We've got a lot of smart people, but none of them are smart enough by themselves to adequately address the array of very complex, fast-moving issues on which we're asked to provide insight.  It's the classic "Two heads are better than one."  Two or more perspectives are likely to identify flaws in the argument and so forth.

This is an absolutely innate approach for our younger workforce, the digital generation that reads e-mail, talks on the phone, watches television, and has a conversation all at the same time, linked up to people around the world.  For them, it is the natural way to operate.  For the intelligence community, it is somewhere between heretical and previously deemed impossible, but we have to teach people to collaborate and break down the impediments to doing so.

Let me turn to some of the key techniques we are using to empower and unleash the potential of the workforce and of the community as an organization.  Some of these are bottom-up.  I spent much of the first year going around the community, to analysts and agencies:  What impedes you from doing your job?  What do you need to do a better job?  What are the issues that I should take on?  What are the windmills in need of jousting?  What is it that will make the biggest impact?  So we got the bottom-up flow.

The other is top-down.  As I mentioned earlier, folks at the top of the intel community have known one another for a long time.  We're there in part, I suppose, because we're the last ones standing, but in part because we've learned something along the way.  We have the ability to identify best practices, are open to ideas and suggestions from outside the community, and are receptive to ideas from respected colleagues who may be in a different part of the bureaucracy.  We are adapting and clarifying what those best practices are and, again on the transparency theme, making them known, making it easier for people to choose from a menu of things that work, rather than prescribing a single formula for addressing problems.

The law that created us mandated joint duty.  It's modeled on Goldwater-Nichols for the military, a rough analogy, but the idea is that people will be better able to collaborate if they actually walk in somebody else's shoes, experience what it's like to work in that organization, and establish a network of friends and colleagues in a different part of the organization that they can activate and utilize day in and day out to do the job better. 

What we're trying to do, to build the expertise, to build the sense of community, and to satisfy the joint duty requirements, is to move toward a reasonably organized set of options that will be career tracks:  each move utilizes and builds expertise and then leads on to a higher level of responsibility requiring a higher level of knowledge and a more effective network of contacts.

I decided to take advantage of the multiple elements in my job jar to establish some demonstration test beds:  ways to try out what we're doing and require people to follow them.  For most of the community, the approach is voluntary:  bring people along and build consensus behind it, which I believe is the necessary way.

But I am directly responsible for two reasonably important sets of activities.  One is the President's Daily Brief, and the other is the National Intelligence Council, which produces estimates and other products.  In these, since I'm in charge, I could simply specify what the new rules and procedures would be.  The goal was not to play emperor.  It was to have fora in which there was cross-community participation, both for the PDB and for the NIC products.  The standards have always been very high.  Bring people in, have them experience it, and send them back to their agencies having experienced, rather than simply heard about, other ways of doing things.  I'll come back to this, but I think we are readily acknowledged to have had a very big impact on both the PDB and the NIC products.

Next, make it possible to discover what we already know.  That process has been awfully serendipitous.  People had their shoebox full of notes, their Rolodex full of the people they know, things in their heads:  "I remember seeing a really good piece.  I can't remember who wrote it; it was about six months ago."

We're doing a number of things to make this easier.  One is called the Library of National Intelligence.  CIA is the executive agent.  It has now been up and ingesting documents for three months, at about 2,500 to 3,000 per day.  We'll ingest all disseminated products, collection products and analytic products alike, and once they're in there, you can do all kinds of things.  The Amazon model:  "the last person that asked for this report also found these useful"; "this report was used in the following analytic products."  It gives us the ability to see whether we already have the answers to questions, and to look across products done in different agencies at different times, compare them on locational data and time-sequence data, and see what has changed.
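The "also found these useful" behavior described here amounts to simple co-occurrence counting over retrieval records.  The sketch below is a minimal, hypothetical illustration of that idea in Python; the record structure, report identifiers, and function names are invented for the example and do not describe the actual Library of National Intelligence.

```python
from collections import Counter, defaultdict

# Hypothetical sketch: count which reports are requested together by the
# same reader, then suggest the most frequent co-occurring reports.
def build_cooccurrence(request_log):
    """request_log: list of (analyst_id, report_id) retrieval events."""
    by_analyst = defaultdict(set)
    for analyst, report in request_log:
        by_analyst[analyst].add(report)

    related = defaultdict(Counter)
    for reports in by_analyst.values():
        for a in reports:
            for b in reports:
                if a != b:
                    related[a][b] += 1
    return related

def also_found_useful(related, report_id, top_n=3):
    """'The last person that asked for this report also found these useful.'"""
    return [r for r, _ in related[report_id].most_common(top_n)]

# Toy usage with made-up report identifiers.
log = [("a1", "report-x"), ("a1", "report-y"),
       ("a2", "report-x"), ("a2", "report-z")]
rel = build_cooccurrence(log)
print(also_found_useful(rel, "report-x"))
```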

We've also used a competitive process.  When we took it over, there were 11 proposals nominated; in the last year, it was 113.  One of the requirements is that the people have to be from more than one agency, and the proposal has to utilize more than one type of intelligence:  signals intelligence, geospatial intelligence, human intelligence.  The success rate is better than 50 percent, and almost all of them have been problems that somebody had determined were unsolvable, such as locating underground facilities or tracking foreign leaders.  The results are really remarkable:  utilizing mathematical algorithms, running them against historical data, mining what we already know, outreach, networks of experts.

I spent a long time in INR, which is a little organization.  But the State Department's not little, and the intelligence community's not little.  It matters to know who is working on your problem, whether they are in your bureau, in the department, or elsewhere in the intelligence community.  Through the State Department's conference and seminar program, in the last year I was assistant secretary we did 100 -- I'm sorry, 280 -- seminars -- (inaudible) -- where we reach out to the experts and bring them in.  We're making that a community-wide approach.

Alternative analysis.  Having taught graduate students for many years, you drive them nuts by taking their set of facts, rearranging them, and coming out with an alternative explanation.  It's sort of an automatic thing to do.  We need to be doing it.  Every analyst needs to be doing it.  Multiple hypotheses, rather than being lawyers who have a bottom line and marshal the evidence to prove the case.

Alternative analysis is a deliberate choice of words, as opposed to competitive analysis, which is what is written into some of the old executive orders and statutes.  They were intended to mean the same thing, a different set of eyes, a fresh view, but over time competitive analysis in the intelligence community turned into an early 20th-century Pulitzer-versus-Hearst scooping of one another, a rather silly exercise in becoming the first one to misinform policymakers.  We want people to get it right, not be there first.

A couple words on tools, and then I will bring this to a close and take your questions.

Tools is a bad word, a dirty word, in the analytic community.  There have been so many false promises, so many versions of "here's a new cell phone that does 46 more things; you probably have no use for 38 of them, but we want you to have them."  We're trying to get tools that people actually want, developed out of the analytic side, so that using them is not a diversion but a way to do work, a way to do business.

Let me just mention a few.  We didn't know who did what in the intelligence community.  Probably 10 years ago, my good friend John Gannon came up with the idea of a database, an analytic resource catalog:  who knows what languages, served in what countries, has what kind of experience and so forth, so that you could find people.  John couldn't get it started; there were supposedly a lot of reasons why this was a bad idea.  I thought it was a good idea.  Mark Lowenthal actually got it started, but nobody would put data into it, because it was billed as "this is where we'll go if we need to build a task force; we need a Serbo-Croat speaker to send out to East Armpit."  And people didn't put that data in.  Managers saw it as a free-agent list:  if I advertise what talent I've got, somebody, probably Fingar in INR, will try and steal it.

I approached it differently.  As I said, this is a database of expertise.  If you're not in it, it means one of two things:  you don't know anything, or your boss thinks you don't know anything.  If you don't know anything, you're not writing for the president, you're not briefing on the Hill, you're not participating in National Intelligence Council activities.

I said this to a group of community seniors about as large as this group.  Sitting next to me was Pat Kennedy, who was the deputy for Management at the time.  Pat says, "One more thing.  If you're not in Fingar's database, you're not in a funded position."

We're reconciling the budget documents that go to the Hill through this database.  I suddenly discovered I had 1,200 more analysts than I knew I had, even by estimating.  But we can now produce phone books and e-mail directories.  If you need to find an expert on economics in the Andean region, you can find out where they are and how to contact them, and people are using it.
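The kind of lookup described here, finding an expert on economics in the Andean region, can be pictured as a catalog of analysts indexed by attributes.  The following is a minimal, hypothetical sketch; the fields, names, agencies, and addresses are invented for illustration and do not reflect the actual analytic resource catalog.

```python
from dataclasses import dataclass

# Hypothetical sketch: a tiny in-memory expertise catalog searchable by
# topic and region, standing in for the directory lookup described above.
@dataclass
class Analyst:
    name: str
    agency: str
    email: str
    topics: set
    regions: set

CATALOG = [
    Analyst("J. Doe", "Agency A", "jdoe@example.gov", {"economics"}, {"Andean"}),
    Analyst("R. Roe", "Agency B", "rroe@example.gov", {"proliferation"}, {"East Asia"}),
]

def find_experts(topic, region):
    """Return analysts whose declared expertise matches both filters."""
    return [a for a in CATALOG if topic in a.topics and region in a.regions]

for expert in find_experts("economics", "Andean"):
    print(expert.name, expert.agency, expert.email)
```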

I mentioned the library.  Intellipedia.  It's been written up.  It's the Wikipedia on a classified network, with one very important difference:  it's not anonymous.  We want people to establish a reputation.  If you're really good, we want people to know you're good.  If you're making contributions, we want that known.  If you're an idiot, we want that known too. 

This started out as a "we built it; would they come?" experiment.  We had some assumptions about the digital generation.  We've now got over 35,000 registered users.  I have about 18,000 analysts, so there are people using this well beyond the analytic community. 

It has become the vehicle of choice for some kinds of collaboration; you can stand up portals in a matter of hours to deal with crises.  It took this closed community two months less to reach a million edits on Intellipedia than it took Wikipedia on the Internet to hit a million edits.  Now, that's partly because of skills learned outside on the Internet being brought in.  It's become a very, very effective way to do things. 

The final tool I'll mention is A-Space, for Analyst Space.  It's been compared to MySpace, but that's not right.  The idea is a digital workspace where people around the community, located around the U.S. or overseas, in Baghdad or Kabul, can share information, share work in progress, tee up questions, and draw upon databases.

This is also not just an idea; it's up in a prototype version, ingesting data.  We've got about 250 early adopters trying to beat the crap out of it and tell us what they really need.

In the first iteration of this, we got 170 complaints and suggestions.  We adopted over a hundred of them.  And everybody that complained was put on the panel to design A-Space.  Nobody objected to that.  They took this seriously, as a serious effort to build something that incorporated the requirements that would help them to do the job.

Let me close now with where I think we are.  I think it's an attainable objective within the analytic transformation (range ?) by the end of this year to have the changes sufficiently locked in, sufficiently clear and institutionalized, that there will be very little interest in dismantling them.  They will have proven their utility.  And to hedge it, they'll be sufficiently cemented in that even if people are inclined to change them, they'll have to blast them out.

The training, the evaluation, and the judgments about the improvement in the quality of the support we are providing are a daily reinforcement of what we are doing.

Do we have it completely right?  Of course not.  This is going to take probably another several years to get everything really smoothly in place. But I think we're on the right track, and after now 37 years of being in and around the community, I'm as excited about what we're doing now as anything in my career.

Let me stop and invite questions.  And Joe, I guess you moderate.

HELMAN:  Thanks.  First of all, thank you very much for the view of the landscape of intelligence analysis reform.  I suspect when we open the meeting for questions, there will be at least a couple on the NIEs.  So let me start there.

The decision and the process to declassify and release the key judgments:  how did that come about?  And how do you see that taking place with future NIEs, if at all?

FINGAR:  I don't like releasing unclassified key judgments.  One of the aspects of the highly politicized environment in Washington was that, as we began to rebuild confidence in the analytic community and the intelligence community, the Congress began to write into law requirements to produce unclassified versions of key judgments.  We pushed back on most of these successfully, not all, and on some with a waiver.  We are perceived as an honest observer, and I like being perceived as honest, but I'd rather not produce things that are unclassified. 

The first one that we did was on Iraq.  There was a great deal of congressional pressure to do this and, frankly, a great amount of concern that it would leak or be leaked and that we would have no control over what was out there.  This was a year ago January, a period of very high interest.  We decided to do the unclassified version of the key judgments. 

Six months later, when it was time for an update, same kind of pervasive attitude.  We decided to do it again.

We decided on the Iran estimate that we would not do unclassified key judgments.  The nature of the sourcing made it a hard topic to explain.  So we weren't going to do this. 

In October of last year DNI McConnell issued guidelines for declassification.  It is the policy of the DNI not to produce unclassified key judgments.  That's the starting point. 

We live in Washington.  There are policies, and there are needs to accommodate.

What flowed below that were steps and requirements, and I'd boil them down to this:  if you're going to declassify, you have to do it all, as it was in the original, subject only to what you have to take out to protect sources and methods.  No spinning, no cherry-picking, no perception of cherry-picking. 

And history matters here, because part of what made it possible for me to be in this job was the Iraq WMD estimate.  Part of the Iraq WMD history is that the estimate for a very long time remained classified, and most of it still is classified.

The CIA issued a white paper which was in some ways presented, and certainly was interpreted, as an unclassified version of the estimate.  That white paper did not acknowledge the dissenting views that were in the estimate, particularly on the nuclear capabilities.  Because that became the example of spinning, and we can't do that, it puts us into an almost Procrustean bed of reproducing what we had in the classified version.  So, to bring it to a close, there are rules and procedures that set standards, but we hope not to have to use them.

HELMAN:  Okay.

Before we open the meeting to questions, let me just remind everyone, the session is on the record.  We have microphones.  Please raise your hand.  When you're called upon, stand, wait for the microphone, identify yourself and your affiliation.  Please keep your questions brief.

Chris.

QUESTIONER:  Thank you.  Just -- my name is Chris Isham -- sorry -- with CBS.

I wonder if you would elaborate a bit more on the NIE relating to Iran.  It obviously came as a big shock to many people, a lot of our allies, particularly the way it was written.  Do you have any regrets about that?  Do you think it was handled well?  Are you still comfortable with the conclusions of that particular NIE?

FINGAR:  I'm certainly comfortable with the judgments -- the conclusions that are in it, and nobody who has looked at it has challenged those judgments.

If we had thought that this was going to be released, we would have written the key judgments differently than we did.  But the explicit understanding was that this was not going to be declassified.  As it was organized, part of the message, since the audience was going to be people who follow the issue, was to identify what's different from the last time we did this.  And the piece that was different was the decision to halt the weaponization portion of the program in response to unspecified international pressure and scrutiny.

We thought that was important to put up front for the busy people, because the rest, the timelines, that they had a secret nuclear program that they've lied about, that they continue to enrich with centrifuges to build the fissile material necessary for a weapon, that they continue to develop missile delivery capabilities, the timelines for weaponization:  most of those judgments did not change.  We had much higher levels of confidence in them than we had in earlier judgments because of new information.

What's different was the shift from a judgment that Iran was absolutely determined to have a nuclear weapon, with the implication that you can't do anything to prevent it, to a judgment that says they've halted in response to international pressure, suggesting there's a kind of cost-benefit analysis at work.  They've kept, at a minimum, the option of having a weapon open, but maybe there's scope here to work the problem.  We thought that was the most important new finding.

Now, the way in which it was released and the way in which, in my view, it was spun, depicted deliberately by some folks to the media and by some media following along, took it in an entirely different direction from what we thought was the principal message.

QUESTIONER:  Jeffrey Toobin from CNN and The New Yorker.

I was wondering how much of your reforms relate to the policy environment of this administration.  Or, to put it another way, what difference does it make who the president is?

FINGAR:  In terms of what we do, almost none of it, with the exception of the PDB, should be dependent on who the president is.  The PDB is tailored, and it reflects, in this case, a president who spends a great deal of time on intelligence.

Among the things that we are doing now, one of the changes is bringing analysts in two or three times a week to participate in the dialogue with principals, in what are called "deep dives," to really bore in on issues.  Whether future presidents will allocate as much time to this, or want to do it in that way, we will adjust to the wishes of the successor.  But the objectivity, the desire for clarity, the clear expression of confidence levels, and the distinction between the evidence and the judgments that go into this should not change with a change of administration.

HELMAN:  Sir.

QUESTIONER:  Frank McClane (ph) with -- (off mike).

There is a psychological factor which we ought to keep in mind, the factor of fear:  if the findings are horrible enough, in some situations the decision-maker may push them aside.  The classic historical example is Barbarossa, starting -- (inaudible) -- from the (Roosevelt ?) administration.  He was afraid that if he responded to it, he might even provoke the Nazi attack on the Soviet Union.  There may be additional examples when we get more data declassified about planning for Iraq.  My hunch is we'll find some such thing.

One example I can think of, a bit far-fetched but suitable for an on-the-record session, is the electromagnetic pulse:  the havoc that could be caused by a single nuclear weapon detonated in the atmosphere that would knock out electricity, possibly from coast to coast.  There's an unclassified report by the commission, and there are Russian and Chinese reports on the same subject, but presently we're not doing much about it, because it is awful and we need money for other things.  It's just a suggestion that we need to keep this Barbarossa effect in mind, so that slightly far-fetched warnings which are ugly don't get pushed aside and lost among the priorities.

FINGAR:  Let me actually answer a question that you didn't ask, but that grows out of it, having to do with the responsibility of policymakers to make decisions.  As I try to instill in young analysts, it's not our responsibility to tell them what to decide; it's our responsibility to give them the knowledge that would help them to make a decision.

At the end of the day, there may be a logic to our analysis that one could conclude would lead to a particular decision, or a conscious decision not to decide yet.  That's not our choice.  And if the policymakers and military commanders make a decision that is influenced by factors in addition to what we present, live with it, accept it, because attempting to lead policymakers to a particular choice is wrong.  It is not objective, and it is a violation of the ethos of the profession.

Trying to provide warning that isn't "Chicken Little" but that is genuinely alerting is part of the art.  It works best when you ramp up and educate people so that they think they discovered it themselves, rather than you coming in at the 11th hour ringing a gong.

QUESTIONER:  Let me ask you to speak to some of the changes that you've instituted with respect to the production and review of NIEs.  The external review, the greater community coordination, could you speak to some of those, please?

FINGAR:  Yeah.  Possibly the most important change with respect to NIEs was not made by us.  It was made by George Tenet after the Iraq WMD estimate, and that was to insist on greater clarity about sourcing information.  It started with estimates:  drafters of estimates needed to be given much more information about the sourcing.  George expanded it, and we have expanded it much more broadly. 

If you've got seven reports, it makes a big difference whether they're seven reports from one individual or from seven individuals, whether there is somebody -- (inaudible) -- or it's a liaison source about which you don't know much.  As analysts got that kind of understanding, they asked different kinds of questions about it.  They could go back.  So that's one change. 

There is also the scrubbing of sources that is done now as a part of an estimate:  each of the collection agencies has to submit a written report addressing each of the items it produced that is used in the estimate.  Do they still stand by it?  Do they have any doubts about it?  Have any questions been raised about the source?  That is now a part of the production. 

Second, we've abandoned the drive for consensus, which clearly has a lowest-common-denominator element to it.  It's not very meaningful if we all agree that China's a big country with a large population and a rapidly growing economy. 

We've tried to identify early and explicitly where there are analytical disagreements.  It is not a democratic exercise, deciding the credibility of an analytic judgment on the basis of how many agencies vote for it; the question is, what's the power of the argument? 

And rather than saying somebody dissents from the majority view, if we've got multiple views, and usually it is two, we require that each of them be laid out and the reasons for the disagreement made clear, and we present that very prominently in the estimative process.  We do the same in the PDB. 

We're trying to convey to those who rely on us for support that smart people, looking at the same information, are coming to different judgments.  Much of the time, that's more important than who's right and who's wrong:  we're working with imperfect information.  An understanding that the ice is pretty thin under some of these judgments is critical.  So we built that in. 

Another element is to take out more of what I would call the gratuitous references to, and quotations of, source reporting, which get used as a substitute for tough-minded analysis, as if a quote somehow made the case, but which also jeopardize the sourcing.  It's the analytical judgments stitched to a quote that get used to whack an opponent or that are sexy to leak; people seldom consciously want to leak something about the source or the method.  So we take that material out, put it in an end note, and don't disseminate the end note, in order to protect the often fragile access to the information. 

HELMAN:  Tony. 

QUESTIONER:  I'm Tony Holmes, the Cyrus Vance fellow here. 

I'd like to get you to change focus a little bit from slowly evolving strategic issues, like nuclear weapons in Iran, to shorter term issues, and have you say something about the interplay between the collectors, the analysts and the operators. 

Now, I know the United States government never tortures.  But I've heard many times that people who think they're being tortured often say things that aren't very accurate.  And I wonder:  of the things that the people eliciting this information think might be actionable in the very near term, within hours or a couple of days, how much is never actually analyzed?  How much goes directly to the operations folks to plan on the basis of, and to launch a Predator missile or a B-52 or whatever it is that's leading to this proliferation of reports about innocent people being killed in farmhouses, in villages, and in vehicles around the world? 

HELMAN:  Thank you. 

FINGAR:  The short answer, leaving aside the contextual points you make, is double tracking.  The information that has operational utility, telling a squad on the ground in Baghdad which door they ought to go in and which they ought to avoid, gets delivered to them as quickly as possible.  That same information, whatever its genesis, also goes into the larger pot that is then available for subsequent analysis. 

Some of this gets done within the collection room as part of the vetting process.  How has that particular source done?  Does it check out or not check out?  So people get a reputation and a record.  Is it as flawless/seamless as it should be or as it could be?  Not yet.  And an awful lot of it has to do with the pace of operations on the ground.

We continue back in Washington to complain that we're not getting enough of the tactical reporting that is collected in Iraq, for example, that would give us insights into stability/instability in neighborhoods, organizational networks and the like.  This is used and consumed in the field, and there's a sense of, "I don't have time to report this."  And it's been a constant tug-of-war trying to capture more of it, but it has the effect of sort of turning the battlespace into a clerical operation, at least in the minds of some.

HELMAN:  Mr. Tempelsman.

QUESTIONER:  Maurice Tempelsman.

How do these changes relate to foreign intelligence services?  Do they imply that you've changed your approach to those services?  And are there other changes that the foreign intelligence services are making that are coordinated with this or consistent with it?

FINGAR:  Let me pick out three elements.  There are actually more.

But one has to do with the pretty high degree of interest in what we are doing on the part of other services.  It's been quite public.  They know the catharsis that we went through.  Many realize that they are not well structured for the contemporary world and the world to come; that they have to make changes.  And they're very interested in sort of what we're doing, why we're doing it, is it scalable from us to them and the like.

One of the things that I've observed that actually gives them a lot of heartburn is our collaboration, sharing.  For all of our problems, all of our stovepiping and cultural barriers, we're light years ahead of most of our partners in terms of their willingness to share across their own system.

A second has to do with excitement about the kind of analytic transformation that I've described:  harnessing outside expertise, working virtually, dealing with the digital generation that they would also like to learn from.  Some of the close partners want to make sure that as we do this, we don't do it in ways that will prevent them from engaging with us.  Let's keep the door open.

The third is fear.  It's a different kind of fear, but it's no less strong.  When we talk about sharing their information within the larger community, there is a reaction of, "Oh, my information is very sensitive, and I don't think I want it seen by all those many people."  The scale of our community is intimidating to some.  So we get, "We want to have access and take advantage of what you're doing, but we're kind of nervous about our stuff being in it."

I have kind of a flip answer, which is simple:  if you don't trust my analysts to use your material, you shouldn't expect them to provide any analytic judgments to share with you.  It resonates.

The final one is in terms of alternative analysis.  One of the valuable things about our partner services, particularly the best of them, is getting the perspective of the Spanish on the Maghreb, of the Mexicans on pieces of Latin America, of the Australians on the part of the world they live in.  Going in with the same terms of reference and the same set of questions, they've got different sorts of information, and they look at it through their own policy lenses and mission requirements.  Much of the time we find we're in agreement and it corroborates our view.  When you respect somebody and they come up with a different answer, we say, hello, we'd better dig into this a little more.

HELMAN:  We have time for one more, but before we get to that, I'd like to thank you for coming and offer you the opportunity for any last comments you might have.

FINGAR:  No, just that I'm delighted.  I wish there were more time, because I'm enjoying this, but let's go to one more question.

HELMAN:  Okay, we had one more.  Sir.

QUESTIONER:  Scott Malcolmson from the New York Times.  Last July you talked at some length about a kind of ambitious program for using open sources, and it wasn't really part of your talk today.  I was just curious what happened with those ambitions.

FINGAR:  The ambitions are there.  My references to ordinary information, open source, I think I hit only in passing.  I'll make up a number, but the magnitude is right:  on probably 75 or 80 percent of the issues we work, you can get 85 or 90 percent of the way there with open source information.  That doesn't help you penetrate a terrorist group, but it tells you an awful lot about European politics.  It tells you an awful lot about economic issues. 

Not only is that information available, it's often pretty well organized.  It's accessible.  It's identified as to who produced it so you can go ask questions about it and you don't have to wonder who this unknown source of questionable reliability is.  And it makes it possible to reach out to people who don't have clearances but know a great deal and can tell you what they saw when they were on their latest trip to Chile.  And they go back every six months for business, and they can tell you what changed.

This is terribly important to us.  We have in the final stages a draft intelligence community directive, the rules we live by, that mandates that agencies enable analysts to reach out beyond the community, simplify the procedures for doing so, and reward people who do it.  It ought to be a normal part of what we do, not being fixated on secrets dribbling into the computer's in-box.

HELMAN:  Well, thank you very much again for joining us today.   Thank you all -- (applause).

FINGAR:  Thank you all. 

-------------------------

            (C) COPYRIGHT 2008, FEDERAL NEWS SERVICE, INC., 1000 VERMONT AVE.

NW; 5TH FLOOR; WASHINGTON, DC - 20005, USA.  ALL RIGHTS RESERVED.  ANY REPRODUCTION, REDISTRIBUTION OR RETRANSMISSION IS EXPRESSLY PROHIBITED.

            UNAUTHORIZED REPRODUCTION, REDISTRIBUTION OR RETRANSMISSION CONSTITUTES A MISAPPROPRIATION UNDER APPLICABLE UNFAIR COMPETITION LAW, AND FEDERAL NEWS SERVICE, INC. RESERVES THE RIGHT TO PURSUE ALL REMEDIES AVAILABLE TO IT IN RESPECT TO SUCH MISAPPROPRIATION.

            FEDERAL NEWS SERVICE, INC. IS A PRIVATE FIRM AND IS NOT AFFILIATED WITH THE FEDERAL GOVERNMENT.  NO COPYRIGHT IS CLAIMED AS TO ANY PART OF THE ORIGINAL WORK PREPARED BY A UNITED STATES GOVERNMENT OFFICER OR EMPLOYEE AS PART OF THAT PERSON'S OFFICIAL DUTIES.

            FOR INFORMATION ON SUBSCRIBING TO FNS, PLEASE CALL CARINA NYBERG AT 202-347-1400.

            THIS IS A RUSH TRANSCRIPT.

-------------------------
